Microscopic theory of intrinsic timescales in spiking neural networks
Authors
Abstract
A complex interplay of single-neuron properties and the recurrent network structure shapes the activity of cortical neurons. The single-neuron activity statistics differ in general from the respective population statistics, including spectra and, correspondingly, autocorrelation times. We develop a theory for the self-consistent second-order statistics of block-structured sparse random networks of spiking neurons. In particular, the theory predicts neuron-level autocorrelation times, also known as intrinsic timescales, of the neuronal activity. It is based on an extension of dynamic mean-field theory from rate networks to spiking networks and is validated via simulations. It accounts for both static variability, e.g., due to the distributed number of incoming synapses per neuron, and temporal fluctuations of the input. We apply the theory to balanced networks of generalized linear model neurons and of leaky integrate-and-fire neurons with biologically constrained parameters. For generalized linear model neurons with an error function nonlinearity, a novel analytical solution of the colored noise problem allows us to obtain self-consistent firing rate distributions, power spectra, and timescales. For leaky integrate-and-fire neurons, we derive an approximate solution of the colored noise problem based on the Stratonovich approximation of the Wiener-Rice series and on free upcrossing statistics. Again closing the system self-consistently, in the fluctuation-driven regime this approximation yields reliable estimates of the mean firing rate and its variance across neurons, as well as of the interspike-interval distribution. With the help of our theory, we find parameter regimes in which the intrinsic timescale significantly exceeds the membrane time constant, which indicates the influence of network dynamics. Although the resulting timescales are of the same order for neurons in the two systems, they arise fundamentally differently: in the former, a longer timescale arises from an increased firing probability after a spike; in the latter, it is a consequence of a prolonged effective refractory period with decreased firing probability. Furthermore, the timescale attains a maximum at a critical synaptic strength, in contrast to the minimum found in rate networks.
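As a concrete illustration of the central quantity, the following minimal sketch (Python/NumPy, not the paper's mean-field calculation) simulates a small block-structured sparse network of discrete-time neurons with an error-function spiking nonlinearity and estimates each neuron's intrinsic timescale as the integrated autocorrelation of its spike train. All names and parameter values (N_E, N_I, p, J, g, tau_m, ext, phi, intrinsic_timescale) are illustrative assumptions, not quantities taken from the paper.

# Minimal, self-contained sketch: sparse network of discrete-time neurons with an
# error-function spiking nonlinearity; intrinsic timescales estimated from the
# spike-train autocorrelation. All parameter values are illustrative assumptions.
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)

# network parameters (assumed)
N_E, N_I = 400, 100        # excitatory / inhibitory population sizes
N = N_E + N_I
p = 0.1                    # connection probability (sparse connectivity)
J, g = 0.2, 5.0            # excitatory weight and inhibition dominance factor
dt = 1e-3                  # time step / spike bin width (s)
tau_m = 10e-3              # input filter ("membrane") time constant (s)
T_steps = 50_000           # simulation length in bins

# block-structured sparse connectivity: E columns excite, I columns inhibit
W = (rng.random((N, N)) < p).astype(float)
W[:, :N_E] *= J
W[:, N_E:] *= -g * J

def phi(x, theta=1.0, sigma=0.5):
    # spike probability per bin: error-function nonlinearity (assumed parameters)
    return 0.5 * (1.0 + erf((x - theta) / (np.sqrt(2.0) * sigma)))

# discrete-time dynamics: exponentially filtered recurrent input + constant drive
h = np.zeros(N)
spikes = np.zeros((T_steps, N), dtype=np.uint8)
ext = 0.8                  # constant external drive (assumed)
decay = np.exp(-dt / tau_m)
for t in range(T_steps):
    s = (rng.random(N) < phi(h + ext)).astype(np.uint8)
    spikes[t] = s
    h = decay * h + (1.0 - decay) * (W @ s)

def intrinsic_timescale(x, dt, max_lag=300):
    # integrated normalized autocovariance of a single spike train
    x = x - x.mean()
    var = np.dot(x, x) / len(x)
    if var == 0.0:
        return np.nan
    acf = np.array([np.dot(x[:-k], x[k:]) / ((len(x) - k) * var)
                    for k in range(1, max_lag)])
    return dt * (0.5 + acf.sum())   # lag-0 bin contributes dt/2

sp = spikes[5000:]                   # discard initial transient
tau_intr = np.array([intrinsic_timescale(sp[:, i].astype(float), dt)
                     for i in range(0, N, 25)])   # subsample neurons for speed
print(f"median intrinsic timescale: {np.nanmedian(tau_intr) * 1e3:.1f} ms, "
      f"membrane time constant: {tau_m * 1e3:.1f} ms")

Defining the timescale as the integral of the normalized autocorrelation function is one common convention; fitting an exponential decay to the autocorrelation is an equally common alternative, and either estimate can be compared against the membrane time constant as discussed in the abstract.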
Similar Papers
Learning Multiple Timescales in Recurrent Neural Networks
Recurrent Neural Networks (RNNs) are powerful architectures for sequence learning. Recent advances on the vanishing gradient problem have led to improved results and an increased research interest. Among recent proposals are architectural innovations that allow the emergence of multiple timescales during training. This paper explores a number of architectures for sequence generation and predict...
Cell Microscopic Segmentation with Spiking Neuron Networks
Spiking Neuron Networks (SNNs) surpass the computational power of neural networks made of threshold or sigmoidal units. Indeed, SNNs add a new dimension, the temporal axis, to the representation capacity and the processing abilities of neural networks. In this paper, we present how SNNs can be applied effectively to cell microscopic image segmentation. Results obtained confirm the validity ...
Modeling spiking neural networks
A notation for the functional specification of a wide range of neural networks consisting of temporal or non-temporal neurons is proposed. The notation is primarily a mathematical framework, but it can also be illustrated graphically and can be extended into a language in order to be automated. Its basic building blocks are processing entities, finer grained than neurons, connected by instant ...
Embedded spiking neural networks
In this paper we introduce the ongoing research at our department concerning hardware implementations of spiking neural networks on embedded systems. Our goal is to implement a spiking neural network in reconfigurable hardware, more specifically embedded systems. Keywords— Hardware neural networks, embedded sys-
Mean Field Theory for Random Recurrent Spiking Neural Networks
Recurrent spiking neural networks can provide a biologically inspired model of a robot controller. We study here the dynamics of large, randomly connected networks thanks to "mean field theory". Mean field theory allows one to compute their dynamics under the assumption that the dynamics of individual neurons are stochastically independent. We restrict ourselves to the simple case of homogeneous cen...
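As a rough illustration of the mean-field idea mentioned in the snippet above (treating neurons as stochastically independent and closing the population statistics on themselves), the sketch below iterates a self-consistent stationary rate for a single homogeneous population with an error-function gain. The gain function, parameter values, and damping factor are assumptions made for this example; this is not the specific mean-field system of either paper.

# Minimal sketch of a self-consistent (mean-field) stationary rate for one
# homogeneous population with an error-function gain; illustrative only.
import numpy as np
from scipy.special import erf

C, J, tau = 100, -0.1, 10e-3   # in-degree, (net inhibitory) weight, time constant (assumed)
mu_ext = 2.0                   # external mean drive (assumed)
r_max = 100.0                  # saturation rate of the gain (Hz), assumed

def gain(mu, sigma, theta=1.0):
    # stationary rate of a unit receiving Gaussian input with mean mu, std sigma
    return r_max * 0.5 * (1.0 + erf((mu - theta) / (np.sqrt(2.0) * sigma + 1e-12)))

nu = 10.0                      # initial guess for the population rate (Hz)
for _ in range(300):           # damped fixed-point iteration: rate -> input stats -> rate
    mu = mu_ext + C * J * tau * nu          # mean recurrent + external input
    sigma = np.sqrt(C * J**2 * tau * nu)    # input fluctuations from independent spiking
    nu = 0.9 * nu + 0.1 * gain(mu, sigma)   # damping keeps the iteration stable
print(f"self-consistent stationary rate: {nu:.2f} Hz")

The damping factor keeps the fixed-point iteration stable when the gain is steep; the main paper above closes an analogous self-consistency at the level of second-order statistics (autocorrelations), not only of rates.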
Journal
Journal title: Physical Review Research
Year: 2021
ISSN: 2643-1564
DOI: https://doi.org/10.1103/physrevresearch.3.043077